Discriminality-driven regularization framework for indefinite kernel machine

Authors

  • Hui Xue
  • Songcan Chen
Abstract

Indefinite kernel machines have attracted increasing interest in machine learning because of their better empirical classification performance than common positive definite kernel machines in many applications. A key to implementing an effective kernel machine is using prior knowledge as fully as possible to guide the appropriate construction of the kernels. However, most existing indefinite kernel machines use the knowledge involved in the data, such as discriminative and structural information, insufficiently and thus construct their indefinite kernels empirically. Discriminatively regularized least-squares classification (DRLSC) is a recently proposed supervised classification method that provides a new discriminality-driven regularizer to encourage the discriminality of the classifier rather than the common smoothness. In this paper, we rigorously validate that the discriminative regularizer naturally coincides with the definition of the inner product in a Reproducing Kernel Kreǐn Space (RKKS). We then present a new discriminality-driven regularization framework for indefinite kernel machines based on the discriminative regularizer. Within this framework, we first reintroduce the original DRLSC from the viewpoint of proper indefinite kernelization rather than the empirical kernel mapping. We then propose a novel semi-supervised algorithm in terms of a different definition of the regularizer. Experiments on both toy and real-world datasets demonstrate the superiority of the two algorithms. © 2014 Elsevier B.V. All rights reserved.
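The paper's DRLSC regularizer is not reproduced here, but the indefinite-kernel setting the abstract describes can be illustrated with a minimal sketch: kernel regularized least-squares classification using the hyperbolic-tangent (sigmoid) kernel, a classic indefinite kernel. The function names, kernel parameters, and toy data below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def tanh_kernel(X, Z, a=1.0, b=-1.0):
    # The hyperbolic-tangent kernel tanh(a <x, z> + b) is a well-known
    # indefinite kernel: its Gram matrix can have negative eigenvalues.
    return np.tanh(a * X @ Z.T + b)

def fit_klsc(X, y, lam=1.0):
    # Kernelized least-squares classification: solve (K + lam I) alpha = y.
    # With an indefinite K the system is still a plain linear solve,
    # but K + lam I need not be positive definite (no Cholesky).
    K = tanh_kernel(X, X)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_test):
    # Decision function f(x) = sum_i alpha_i k(x, x_i); classify by sign.
    return np.sign(tanh_kernel(X_test, X_train) @ alpha)

# Tiny two-class toy problem (illustrative only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)),
               rng.normal(2.0, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])

alpha = fit_klsc(X, y)
acc = np.mean(predict(X, alpha, X) == y)
print(acc)
```

Note that, unlike the positive definite case, the solution here has no reproducing-kernel-Hilbert-space interpretation; the paper's point is that the discriminative regularizer supplies the missing geometry via an RKKS inner product instead.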


Similar articles

Indefinite Kernel Network with Dependent Sampling

We study the asymptotic properties of indefinite kernel networks with coefficient regularization and dependent sampling. The framework under investigation differs from classical kernel learning: positive definiteness is not required of the kernel function, and the samples are allowed to be weakly dependent, with the dependence measured by a strong mixing condition. By a new kernel decomposi...


Multiple Indefinite Kernel Learning for Feature Selection

Multiple kernel learning for feature selection (MKL-FS) utilizes kernels to explore complex properties of features and performs better in embedded methods. However, the kernels in MKL-FS are generally limited to be positive definite. In fact, indefinite kernels often emerge in actual applications and can achieve better empirical performance. But due to the non-convexity of indefinite kernels, ex...


Learning SVM Classifiers with Indefinite Kernels

Recently, training support vector machines with indefinite kernels has attracted great attention in the machine learning community. In this paper, we tackle this problem by formulating a joint optimization model over SVM classifications and kernel principal component analysis. We first reformulate the kernel principal component analysis as a general kernel transformation framework, and then inc...


Splines with non positive kernels

Nonparametric regression methods fall into two main clusters: smoothing-spline methods, which require positive kernels, and Nonparametric Kernel Regression, which allows the use of non-positive kernels such as the Epanechnikov kernel. We propose a generalization of the smoothing spline method to include kernels which are still symmetric but not positive semi-d...


Kernel MSE Algorithm: A Unified Framework for KFD, LS-SVM and KRR

In this paper, we generalize the conventional minimum squared error (MSE) method to yield a new nonlinear learning machine by using the kernel idea and adding different regularization terms. We name it the kernel minimum squared error (KMSE) algorithm, which can deal with linear and nonlinear classification and regression problems. With proper choices of the output coding schemes and regulariza...



Journal:
  • Neurocomputing

Volume 133, Issue –

Pages –

Publication date: 2014